The idb npm package is a small library that mirrors IndexedDB, but features a more user-friendly API. It allows developers to interact with IndexedDB using promises instead of events, which simplifies the code and improves readability and maintainability. It's particularly useful for applications that need to store large amounts of data on the client side.
Database Creation and Versioning
This code sample demonstrates how to create a new IndexedDB database and an object store within it using idb. The `openDB` function is used to open a database, and it takes the database name, version number, and an upgrade callback as arguments. The upgrade callback is executed when the database is created or upgraded.
const dbPromise = idb.openDB('my-database', 1, {
  upgrade(db) {
    db.createObjectStore('keyval');
  }
});
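For illustration (not part of the original sample), the same upgrade callback can branch on the old version when you later bump the version number; the 'logs' store here is hypothetical:
const upgradedDbPromise = idb.openDB('my-database', 2, {
  upgrade(db, oldVersion) {
    if (oldVersion < 1) {
      db.createObjectStore('keyval');
    }
    if (oldVersion < 2) {
      // Hypothetical second store added in version 2.
      db.createObjectStore('logs', { keyPath: 'id', autoIncrement: true });
    }
  }
});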
Adding and Retrieving Data
This code sample shows how to add data to and retrieve data from an object store in IndexedDB using idb. The `addData` function creates a transaction, accesses the object store, and uses the `put` method to add data. The `getData` function retrieves data using the `get` method.
const dbPromise = idb.openDB('my-database', 1);

async function addData(key, val) {
  const db = await dbPromise;
  const tx = db.transaction('keyval', 'readwrite');
  const store = tx.objectStore('keyval');
  store.put(val, key);
  await tx.done;
}

async function getData(key) {
  const db = await dbPromise;
  return db.transaction('keyval').objectStore('keyval').get(key);
}
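Putting the two helpers together might look like this:
async function demo() {
  await addData('greeting', 'hello world');
  console.log(await getData('greeting')); // 'hello world'
}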
Cursor Iteration
This code sample illustrates how to iterate over entries in an object store using a cursor. The `iterateCursor` function opens a read-only transaction, accesses the object store, and uses `openCursor` to start the iteration. The cursor is advanced using the `continue` method.
const dbPromise = idb.openDB('my-database', 1);

async function iterateCursor() {
  const db = await dbPromise;
  const tx = db.transaction('keyval', 'readonly');
  const store = tx.objectStore('keyval');
  let cursor = await store.openCursor();

  while (cursor) {
    console.log(cursor.key, cursor.value);
    cursor = await cursor.continue();
  }
}
Dexie.js is a wrapper library for IndexedDB that allows for simpler syntax and complex queries. It provides a more powerful query language and schema handling compared to idb, which makes it suitable for more complex applications that require advanced database features.
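For comparison, a minimal Dexie.js sketch; the 'friends' table, its schema, and the query are illustrative examples, not taken from either library's docs:
import Dexie from 'dexie';

const db = new Dexie('my-dexie-db');

// '++id' declares an auto-incremented primary key; 'name' and 'age' are indexed.
db.version(1).stores({ friends: '++id, name, age' });

async function demo() {
  await db.friends.add({ name: 'Ada', age: 36 });

  // A range query over the 'age' index, which takes noticeably more code
  // with raw IndexedDB or idb:
  const youngFriends = await db.friends.where('age').below(30).toArray();
  console.log(youngFriends);
}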
PouchDB is an open-source JavaScript database inspired by Apache CouchDB that is designed to run well within the browser. It enables applications to store data locally while offline, then synchronize it with CouchDB and compatible servers when the application is back online, unlike idb which is purely a client-side solution.
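And a minimal PouchDB sketch; the document shape and the remote URL are placeholders:
import PouchDB from 'pouchdb';

const db = new PouchDB('articles');

async function demo() {
  // Writes land in the local, in-browser database first.
  await db.put({ _id: 'article-1', title: 'Hello', body: '…' });

  // Later, replicate with a CouchDB-compatible server when online.
  // The URL below is a placeholder.
  db.sync('https://example.com/db/articles', { live: true, retry: true });
}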
This is a tiny (~1.19kB brotli'd) library that mostly mirrors the IndexedDB API, but with small improvements that make a big difference to usability.
Install via npm:
npm install idb
Then, assuming you're using a module-compatible system (like webpack, Rollup, etc.):
import { openDB, deleteDB, wrap, unwrap } from 'idb';

async function doDatabaseStuff() {
  const db = await openDB(…);
}
Alternatively, you can use it directly in a browser via a module script:
<script type="module">
  import { openDB, deleteDB, wrap, unwrap } from 'https://cdn.jsdelivr.net/npm/idb@8/+esm';

  async function doDatabaseStuff() {
    const db = await openDB(…);
  }
</script>
Or via the traditional UMD build:
<script src="https://cdn.jsdelivr.net/npm/idb@8/build/umd.js"></script>
<script>
  async function doDatabaseStuff() {
    const db = await idb.openDB(…);
  }
</script>
A global, idb, will be created, containing all exports of the module version.
See details of (potentially) breaking changes.
This library targets modern browsers, as in Chrome, Firefox, Safari, and other browsers that use those engines, such as Edge. IE is not supported.
openDB
This method opens a database, and returns a promise for an enhanced IDBDatabase.
const db = await openDB(name, version, {
  upgrade(db, oldVersion, newVersion, transaction, event) {
    // …
  },
  blocked(currentVersion, blockedVersion, event) {
    // …
  },
  blocking(currentVersion, blockedVersion, event) {
    // …
  },
  terminated() {
    // …
  },
});
- name: Name of the database.
- version (optional): Schema version, or undefined to open the current version.
- upgrade (optional): Called if this version of the database has never been opened before. Use it to specify the schema for the database. This is similar to the upgradeneeded event in plain IndexedDB.
  - db: An enhanced IDBDatabase.
  - oldVersion: Last version of the database opened by the user.
  - newVersion: Whatever new version you provided.
  - transaction: An enhanced transaction for this upgrade. This is useful if you need to get data from other stores as part of a migration.
  - event: The event object for the associated upgradeneeded event.
- blocked (optional): Called if there are older versions of the database open on the origin, so this version cannot open. This is similar to the blocked event in plain IndexedDB.
  - currentVersion: Version of the database that's blocking this one.
  - blockedVersion: The version of the database being blocked (whatever version you provided to openDB).
  - event: The event object for the associated blocked event.
- blocking (optional): Called if this connection is blocking a future version of the database from opening. This is similar to the versionchange event in plain IndexedDB.
  - currentVersion: Version of the open database (whatever version you provided to openDB).
  - blockedVersion: The version of the database that's being blocked.
  - event: The event object for the associated versionchange event.
- terminated (optional): Called if the browser abnormally terminates the connection, but not on regular closures like calling db.close(). This is similar to the close event in plain IndexedDB.

deleteDB
Deletes a database.
await deleteDB(name, {
  blocked() {
    // …
  },
});
- name: Name of the database.
- blocked (optional): Called if the database already exists and there are open connections that don't close in response to a versionchange event; the request will be blocked until they all close.
  - currentVersion: Version of the database that's blocking the delete operation.
  - event: The event object for the associated versionchange event.

unwrap
Takes an enhanced IndexedDB object and returns the plain unmodified one.
const unwrapped = unwrap(wrapped);
This is useful if, for some reason, you want to drop back into plain IndexedDB. Promises will also be converted back into IDBRequest objects.
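For example, assuming a database opened with openDB that contains a 'keyval' store, you can unwrap a store and fall back to the event-based IDBRequest API:
// Unwrap the enhanced store to get the plain IDBObjectStore.
const plainStore = unwrap(db.transaction('keyval').store);
const request = plainStore.get('hello');

request.onsuccess = () => {
  console.log(request.result);
};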
wrap
Takes an IDB object and returns a version enhanced by this library.
const wrapped = wrap(unwrapped);
This is useful if some third party code gives you an IDBDatabase object and you want it to have the features of this library.
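For example, assuming plainDb is a plain IDBDatabase handed to you by other code, wrapping it gives you the promise-based helpers described below:
// plainDb is a plain IDBDatabase from elsewhere (hypothetical).
const enhancedDb = wrap(plainDb);

// The promise-based API is now available, e.g. the get() shortcut
// (assuming a 'keyval' store exists in that database):
const value = await enhancedDb.get('keyval', 'hello');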
Once you've opened the database, the API is the same as IndexedDB, except for a few changes to make things easier.
Firstly, any method that usually returns an IDBRequest object will now return a promise for the result.
const store = db.transaction(storeName).objectStore(storeName);
const value = await store.get(key);
The library turns all IDBRequest objects into promises, but it doesn't know in advance which methods may return promises. As a result, methods such as store.put may throw instead of returning a promise. If you're using async functions, there's no observable difference.
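In practice this means that, inside an async function, a synchronous throw and a rejected promise both land in the same catch block. A sketch, assuming a 'keyval' store:
const tx = db.transaction('keyval', 'readwrite');

try {
  // Whether put() throws synchronously or returns a rejecting promise,
  // the failure surfaces in this catch block.
  await tx.store.put('value', 'key');
  await tx.done;
} catch (err) {
  console.error('Write failed:', err);
}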
Transaction lifetime
TL;DR: Do not await other things between the start and end of your transaction, otherwise the transaction will close before you're done.
An IDB transaction auto-closes if it doesn't have anything left to do once microtasks have been processed. As a result, this works fine:
const tx = db.transaction('keyval', 'readwrite');
const store = tx.objectStore('keyval');
const val = (await store.get('counter')) || 0;
await store.put(val + 1, 'counter');
await tx.done;
But this doesn't:
const tx = db.transaction('keyval', 'readwrite');
const store = tx.objectStore('keyval');
const val = (await store.get('counter')) || 0;
// This is where things go wrong:
const newVal = await fetch('/increment?val=' + val);
// And this throws an error:
await store.put(newVal, 'counter');
await tx.done;
In this case, the transaction closes while the browser is fetching, so store.put fails.
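One way to restructure the broken example, as a sketch: keep the slow async work outside any transaction, and let the single-action helpers (described below) open short transactions for you:
// Read in one short transaction:
const val = (await db.get('keyval', 'counter')) || 0;

// Do the network work with no transaction open:
const response = await fetch('/increment?val=' + val);
const newVal = await response.text();

// Then write in a fresh transaction:
await db.put('keyval', newVal, 'counter');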
IDBDatabase enhancements
It's common to create a transaction for a single action, so helper methods are included for this:
// Get a value from a store:
const value = await db.get(storeName, key);
// Set a value in a store:
await db.put(storeName, value, key);
The shortcuts are: get, getKey, getAll, getAllKeys, count, put, add, delete, and clear. Each method takes a storeName argument (the name of the object store), and the rest of the arguments are the same as the equivalent IDBObjectStore method.
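For illustration, a few of the other shortcuts; the store and key names are just examples:
// Count the entries in a store:
const total = await db.count('keyval');

// add() fails if the key already exists, unlike put():
await db.add('keyval', 'value', 'new-key');

// Delete an entry, then list the remaining keys:
await db.delete('keyval', 'new-key');
const remainingKeys = await db.getAllKeys('keyval');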
There are also shortcuts to get from an index: getFromIndex, getKeyFromIndex, getAllFromIndex, getAllKeysFromIndex, and countFromIndex.
// Get a value from an index:
const value = await db.getFromIndex(storeName, indexName, key);
Each method takes storeName and indexName arguments, followed by the rest of the arguments from the equivalent IDBIndex method.
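Because the trailing arguments match the IDBIndex methods, you can also pass a key range. A sketch, assuming the 'products' store and 'by-price' index from the TypeScript example later in this document:
// All products priced between 10 and 20 (inclusive), read via the index:
const midPriced = await db.getAllFromIndex(
  'products',
  'by-price',
  IDBKeyRange.bound(10, 20),
);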
IDBTransaction enhancements
tx.store
If a transaction involves a single store, the store property will reference that store.
const tx = db.transaction('whatever');
const store = tx.store;
If a transaction involves multiple stores, tx.store is undefined, so you need to use tx.objectStore(storeName) to get the stores.
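For example, with two hypothetical stores in one transaction:
const tx = db.transaction(['keyval', 'articles'], 'readwrite');
const keyvalStore = tx.objectStore('keyval');
const articleStore = tx.objectStore('articles');

await Promise.all([
  keyvalStore.put('value', 'key'),
  articleStore.put({ id: 1, title: 'Hello' }),
  tx.done,
]);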
tx.done
Transactions have a .done promise which resolves when the transaction completes successfully, and otherwise rejects with the transaction error.
const tx = db.transaction(storeName, 'readwrite');

await Promise.all([
  tx.store.put('bar', 'foo'),
  tx.store.put('world', 'hello'),
  tx.done,
]);
If you're writing to the database, tx.done is the signal that everything was successfully committed to the database. However, it's still beneficial to await the individual operations, as you'll see the error that caused the transaction to fail.
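For example, awaiting the write alongside tx.done surfaces the specific failure. A sketch, assuming a 'keyval' store where the key 'foo' may already exist:
const tx = db.transaction('keyval', 'readwrite');

try {
  await Promise.all([
    // add() rejects if 'foo' is already present, which aborts the transaction.
    tx.store.add('bar', 'foo'),
    tx.done,
  ]);
} catch (err) {
  console.error('Transaction failed:', err);
}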
IDBCursor enhancements
Cursor advance methods (advance, continue, continuePrimaryKey) return a promise for the cursor, or null if there are no further values to provide.
let cursor = await db.transaction(storeName).store.openCursor();

while (cursor) {
  console.log(cursor.key, cursor.value);
  cursor = await cursor.continue();
}
Async iterators
You can iterate over stores, indexes, and cursors:
const tx = db.transaction(storeName);

for await (const cursor of tx.store) {
  // …
}
Each yielded object is an IDBCursor. You can optionally use the advance methods to skip items (within an async iterator they return void):
const tx = db.transaction(storeName);

for await (const cursor of tx.store) {
  console.log(cursor.value);
  // Skip the next item
  cursor.advance(2);
}
If you don't manually advance the cursor, cursor.continue() is called for you.
Stores and indexes also have an iterate method which has the same signature as openCursor, but returns an async iterator:
const index = db.transaction('books').store.index('author');

for await (const cursor of index.iterate('Douglas Adams')) {
  console.log(cursor.value);
}
Example: KeyVal store
This is very similar to localStorage, but async. If this is all you need, you may be interested in idb-keyval. You can always upgrade to this library later.
import { openDB } from 'idb';

const dbPromise = openDB('keyval-store', 1, {
  upgrade(db) {
    db.createObjectStore('keyval');
  },
});

export async function get(key) {
  return (await dbPromise).get('keyval', key);
}
export async function set(key, val) {
  return (await dbPromise).put('keyval', val, key);
}
export async function del(key) {
  return (await dbPromise).delete('keyval', key);
}
export async function clear() {
  return (await dbPromise).clear('keyval');
}
export async function keys() {
  return (await dbPromise).getAllKeys('keyval');
}
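Assuming the module above is saved as keyval.js (a hypothetical filename), it can be used like this:
import { set, get, keys } from './keyval.js';

await set('hello', 'world');
console.log(await get('hello')); // 'world'
console.log(await keys()); // ['hello']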
Example: Article store
import { openDB } from 'idb/with-async-ittr.js';

async function demo() {
  const db = await openDB('Articles', 1, {
    upgrade(db) {
      // Create a store of objects
      const store = db.createObjectStore('articles', {
        // The 'id' property of the object will be the key.
        keyPath: 'id',
        // If it isn't explicitly set, create a value by auto incrementing.
        autoIncrement: true,
      });
      // Create an index on the 'date' property of the objects.
      store.createIndex('date', 'date');
    },
  });

  // Add an article:
  await db.add('articles', {
    title: 'Article 1',
    date: new Date('2019-01-01'),
    body: '…',
  });

  // Add multiple articles in one transaction:
  {
    const tx = db.transaction('articles', 'readwrite');
    await Promise.all([
      tx.store.add({
        title: 'Article 2',
        date: new Date('2019-01-01'),
        body: '…',
      }),
      tx.store.add({
        title: 'Article 3',
        date: new Date('2019-01-02'),
        body: '…',
      }),
      tx.done,
    ]);
  }

  // Get all the articles in date order:
  console.log(await db.getAllFromIndex('articles', 'date'));

  // Add 'And, happy new year!' to all articles on 2019-01-01:
  {
    const tx = db.transaction('articles', 'readwrite');
    const index = tx.store.index('date');

    for await (const cursor of index.iterate(new Date('2019-01-01'))) {
      const article = { ...cursor.value };
      article.body += ' And, happy new year!';
      cursor.update(article);
    }

    await tx.done;
  }
}
TypeScript
This library is fully typed, and you can improve things by providing types for your database:
import { openDB, DBSchema } from 'idb';

interface MyDB extends DBSchema {
  'favourite-number': {
    key: string;
    value: number;
  };
  products: {
    value: {
      name: string;
      price: number;
      productCode: string;
    };
    key: string;
    indexes: { 'by-price': number };
  };
}
async function demo() {
  const db = await openDB<MyDB>('my-db', 1, {
    upgrade(db) {
      db.createObjectStore('favourite-number');

      const productStore = db.createObjectStore('products', {
        keyPath: 'productCode',
      });
      productStore.createIndex('by-price', 'price');
    },
  });

  // This works
  await db.put('favourite-number', 7, 'Jen');

  // This fails at compile time, as the 'favourite-number' store expects a number.
  await db.put('favourite-number', 'Twelve', 'Jake');
}
To define types for your database, extend DBSchema with an interface where the keys are the names of your object stores.
For each value, provide an object where value is the type of values within the store, and key is the type of keys within the store.
Optionally, indexes can contain a map of index names to the type of key within that index.
Provide this interface when calling openDB, and from then on your database will be strongly typed. This also allows your IDE to autocomplete the names of stores and indexes.
If you call openDB without providing types, your database will use basic types. However, sometimes you'll need to interact with stores that aren't in your schema, perhaps during upgrades. In that case you can cast.
Let's say we were renaming the 'favourite-number' store to 'fave-num':
import { openDB, DBSchema, IDBPDatabase } from 'idb';

interface MyDBV1 extends DBSchema {
  'favourite-number': { key: string; value: number };
}

interface MyDBV2 extends DBSchema {
  'fave-num': { key: string; value: number };
}

const db = await openDB<MyDBV2>('my-db', 2, {
  async upgrade(db, oldVersion, newVersion, transaction) {
    // Cast a reference of the database to the old schema.
    const v1Db = db as unknown as IDBPDatabase<MyDBV1>;

    if (oldVersion < 1) {
      v1Db.createObjectStore('favourite-number');
    }
    if (oldVersion < 2) {
      // The store already exists at this point, so rename it via the
      // upgrade transaction rather than creating it a second time.
      const store = transaction.objectStore('favourite-number' as any);
      store.name = 'fave-num';
    }
  },
});
You can also cast to a typeless database by omitting the type, e.g. db as IDBPDatabase.
Note: Types like IDBPDatabase are used by TypeScript only. The implementation uses proxies under the hood.
Developing
npm run dev
This will also perform type testing.
To test, navigate to build/test/ in a browser. You'll need to set up a basic web server for this.